
Fix token usage not displayed in Namer result#1225

Open
aslamdoctor wants to merge 2 commits into WordPress:trunk from aslamdoctor:fix/1220-token-usage-display

Conversation


@aslamdoctor aslamdoctor commented Mar 17, 2026

Summary

Fixes #1220

  • Root cause: WP_AI_Client_Prompt_Builder provides methods like generate_text_result() via PHP's __call() magic method. The code used method_exists() which returns false for magic methods, so generate_text_result() was never called. Instead, it always fell through to generate_text() which returns a plain string with no token usage data — resulting in "undefined total" in the UI.
  • Fix: Replace method_exists() with is_callable() in execute_ai_request() so the rich result path is used and token usage is returned.
  • Fallback: Compute total_tokens from prompt_tokens + completion_tokens in PHP if not directly available, and add defensive JS handling to prevent "undefined total" display.
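The root cause above can be illustrated with a minimal sketch. The class below is hypothetical (it only mimics the relevant behavior of WP_AI_Client_Prompt_Builder, it is not the plugin's actual code), but the `method_exists()` vs `is_callable()` contrast is exactly the PHP behavior being fixed:

```php
<?php
// Minimal sketch: methods routed through __call() are invisible to
// method_exists() but visible to is_callable().
class PromptBuilder {
	// No generate_text_result() is declared; calls are dispatched here.
	public function __call( $name, $args ) {
		if ( 'generate_text_result' === $name ) {
			// Rich result including token usage (hypothetical shape).
			return array(
				'text'  => 'ok',
				'usage' => array( 'total_tokens' => 42 ),
			);
		}
		if ( 'generate_text' === $name ) {
			return 'ok'; // Plain string, no token usage.
		}
		throw new BadMethodCallException( $name );
	}
}

$builder = new PromptBuilder();

// method_exists() only checks declared methods, so it reports false.
var_dump( method_exists( $builder, 'generate_text_result' ) ); // bool(false)

// is_callable() also accounts for __call(), so it reports true.
var_dump( is_callable( array( $builder, 'generate_text_result' ) ) ); // bool(true)
```

Because the original check used `method_exists()`, the rich-result branch was unreachable and the code always fell through to the plain-string path.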

Screenshot

image

Test plan

  • Go to WP Admin → Tools → Plugin Check Namer
  • Enter a plugin name and click Evaluate name
  • Verify the Tokens used line displays actual numeric values (e.g. 1,234 total (850 prompt + 384 completion))
  • Verify the similar name query token count is also displayed when available

…#1220)

The token usage was always showing "undefined total" because
generate_text_result() was never called on the AI client. The
WP_AI_Client_Prompt_Builder provides its methods via PHP's __call()
magic method, but the code used method_exists() which returns false
for magic methods. This caused the code to always fall through to
generate_text() which returns a plain string without token usage data.

Changes:
- Replace method_exists() with is_callable() in execute_ai_request()
  so generate_text_result() is properly detected and returns a
  GenerativeAiResult object that includes token usage.
- Compute total_tokens from prompt_tokens + completion_tokens as a
  fallback if the AI client doesn't provide it directly.
- Add defensive JS handling to prevent "undefined total" display when
  total_tokens is missing from the response.
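The fallback and defensive-display behavior described in the last two points can be sketched as follows. The helper name `formatTokenUsage` and the response field names are assumptions for illustration, not the plugin's actual code:

```javascript
// Hypothetical helper: derive a display string for token usage,
// falling back to prompt + completion when total_tokens is absent,
// and returning '' (instead of "undefined total") for missing data.
function formatTokenUsage(usage) {
	if (!usage) {
		return '';
	}
	const prompt = Number(usage.prompt_tokens) || 0;
	const completion = Number(usage.completion_tokens) || 0;
	// Prefer the reported total; otherwise compute it as a fallback.
	const total = Number(usage.total_tokens) || prompt + completion;
	if (!total) {
		return '';
	}
	return `${total.toLocaleString()} total (${prompt.toLocaleString()} prompt + ${completion.toLocaleString()} completion)`;
}

console.log(formatTokenUsage({ prompt_tokens: 850, completion_tokens: 384 }));
// e.g. "1,234 total (850 prompt + 384 completion)" in an en-US locale
```

The key point is that every path returns a well-defined string, so the UI never interpolates `undefined` into the "Tokens used" line.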

github-actions bot commented Mar 17, 2026

The following accounts have interacted with this PR and/or linked issues. I will continue to update these lists as activity occurs. You can also manually ask me to refresh this list by adding the props-bot label.

If you're merging code through a pull request on GitHub, copy and paste the following into the bottom of the merge commit message.

Co-authored-by: aslamdoctor <aslamdoctor@git.wordpress.org>
Co-authored-by: davidperezgar <davidperez@git.wordpress.org>
Co-authored-by: raftaar1191 <raftaar1191@git.wordpress.org>

To understand the WordPress project's expectations around crediting contributors, please review the Contributor Attribution page in the Core Handbook.

Contributor

raftaar1191 commented Mar 18, 2026

@aslamdoctor

Here, along with the token count, I also see the model name. I think there should be one more section called Model used.

Now:
image

How I think it should look:
image

@davidperezgar what do you think?

@davidperezgar
Member

Looks nice!


Development

Successfully merging this pull request may close these issues.

Token usage not displayed in Plugin Check Namer result

3 participants